Fetal facial expression in response to intravaginal music emission
Authors
Abstract
This study compared fetal response to musical stimuli applied intravaginally (intravaginal music [IVM]) with application via emitters placed on the mother's abdomen (abdominal music [ABM]). Responses were quantified by recording facial movements identified on 3D/4D ultrasound. One hundred and six normal pregnancies between 14 and 39 weeks of gestation were randomized to 3D/4D ultrasound with: (a) ABM with standard headphones (flute monody at 98.6 dB); (b) IVM with a specially designed device emitting the same monody at 53.7 dB; or (c) intravaginal vibration (IVV; 125 Hz) at 68 dB with the same device. Facial movements were quantified at baseline, during stimulation, and for 5 minutes after stimulation was discontinued. In fetuses at a gestational age of >16 weeks, IVM elicited mouthing (MT) and tongue expulsion (TE) in 86.7% and 46.6% of fetuses, respectively, with significant differences when compared with ABM and IVV (p = 0.002 and p = 0.004, respectively). There were no changes from baseline with ABM or IVV. TE occurred ≥5 times in 5 minutes in 13.3% of fetuses with IVM. IVM was associated with a higher occurrence of MT (odds ratio = 10.980; 95% confidence interval = 3.105-47.546) and TE (odds ratio = 10.943; 95% confidence interval = 2.568-77.037). The frequency of TE with IVM increased significantly with gestational age (p = 0.024). Fetuses at 16-39 weeks of gestation respond to intravaginally emitted music with repetitive MT and TE movements not observed with ABM or IVV. Our findings suggest that the neural pathways participating in the auditory-motor system are developed as early as gestational week 16. These findings might contribute to diagnostic methods for prenatal hearing screening and to research into fetal neurological stimulation.
Similar Articles
The Effect of Vibroacoustic Stimulation and Music on Fetal Movement
Introduction: Fetal movement begins around the 7th week of pregnancy and gradually becomes coordinated and harmonious by the end of pregnancy. Near-term fetuses can discriminate acoustic features, such as frequencies and spectra, and process complex auditory streams. In this study, we aimed to evaluate fetal movement in response to music and vibration stimulation. Materials and Methods: This study is a...
The Effect of Music on Fetus Movement During Non-Stress Test
Introduction: To reduce mortality around the time of birth, it is recommended to evaluate the health of the fetus during pregnancy. The most widely used technique in most centers, regarded as the ideal screening for fetal health assessment, is the non-stress test. Because reduced fetal movement is one of the immediate signs of fetal death, this study was conducted to determine the effec...
The Effect of Relaxing Music on the Life Distress and Maternal-fetal Attachment of Pregnant Women
Background: Although pregnancy and motherhood are enjoyable experiences, they are associated with many physical and psychological changes that require adaptation. The present study aimed to assess the effectiveness of relaxing music on the life distress and maternal-fetal attachment of pregnant women. Methods: It was a quasi-experimental study with a pre-test and post-test design and a control gr...
EmotionFace: Prototype Facial Expression Display of Emotion in Music
EmotionFace is a software interface for visually displaying the self-reported emotion expressed by music. Taken in reverse, it can be viewed as a facial expression whose auditory counterpart is the time-synchronized, associated music. The present instantiation of the software uses a simple schematic face with eyes and mouth moving according to a parabolic model: Smiling and frowning ...
Visualization of music impression in facial expression to represent emotion
In this paper, we propose a method for visualizing the impression of music through facial expression in order to represent emotion. We apply facial expression to represent complicated and mixed emotions. This method can generate facial expressions corresponding to the impressions of music data by measuring the relationship between each basic emotion in facial expression and the impressions extracted from the music data. ...